Tribes Is Hard in the Message Passing Model
We consider the point-to-point message passing model of communication in
which there are processors with individual private inputs, each -bit
long. Each processor is located at the node of an underlying undirected graph
and has access to private random coins. An edge of the graph is a private
channel of communication between its endpoints. The processors have to compute
a given function of all their inputs by communicating along these channels.
While this model has been widely used in distributed computing, strong lower
bounds on the amount of communication needed to compute simple functions have
just begun to appear. In this work, we prove a tight lower bound of
on the communication needed for computing the Tribes function,
when the underlying graph is a star of nodes that has leaves with
inputs and a center with no input. A lower bound on this topology easily implies
comparable bounds for others. Our lower bounds are obtained by building upon
the recent information-theoretic techniques of Braverman et al. (FOCS'13) and
combining them with the earlier work of Jayram, Kumar and Sivakumar (STOC'03).
This approach yields information complexity bounds that are of independent
interest.
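As a reference point, the Tribes function is the AND of disjoint ORs over blocks ("tribes") of input bits. A minimal sketch, where the block size `s` is a parameter of the function family:

```python
def tribes(bits, s):
    """Evaluate Tribes: AND over blocks ('tribes') of OR, each block s bits wide."""
    assert len(bits) % s == 0, "input length must be a multiple of the tribe size"
    return all(any(bits[i:i + s]) for i in range(0, len(bits), s))

# Tribes is 1 iff every tribe contains at least one 1.
print(tribes([0, 1, 0, 0, 1, 1], 3))  # blocks [0,1,0] and [0,1,1] -> True
print(tribes([0, 0, 0, 1, 1, 1], 3))  # first block is all-zero    -> False
```

In the star topology of the paper, each leaf would hold one coordinate of the input and the center coordinates the evaluation.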
Towards Better Separation between Deterministic and Randomized Query Complexity
We show that there exists a Boolean function that exhibits the following
separations among its deterministic query complexity , randomized zero-error
query complexity and randomized one-sided error query
complexity : and
. This refutes the conjecture made by Saks
and Wigderson that for any Boolean function ,
. It also shows the widest known separation between
and for any Boolean function. The function was defined by
Göös, Pitassi and Watson, who studied it to show a separation
between deterministic decision tree complexity and unambiguous
non-deterministic decision tree complexity. Independently of us, Ambainis et al.
proved that different variants of the function certify an optimal (quadratic)
separation between and , and a polynomial separation between
and . Viewed as separation results, our results are subsumed
by those of Ambainis et al. However, while the functions considered in the work
of Ambainis et al. are different variants of , we work with the original
function itself.
Comment: Reference added.
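To make the measures concrete, here is a hypothetical query-counting harness (the names `count_queries` and `det_or` are illustrative, not from the paper) showing how the cost of a decision-tree algorithm is tallied in the query model:

```python
def count_queries(algo, x):
    """Run algo with oracle access to x; return (answer, number of distinct queries)."""
    queried = []

    def oracle(i):
        queried.append(i)
        return x[i]

    ans = algo(oracle, len(x))
    return ans, len(set(queried))

def det_or(oracle, n):
    # A deterministic decision tree for OR: scan until a 1 is found.
    for i in range(n):
        if oracle(i):
            return 1
    return 0

print(count_queries(det_or, [0, 0, 1, 0]))  # (1, 3): three positions probed
```

Deterministic, zero-error, and one-sided error query complexities all measure worst-case counts of this kind, differing in what guarantees the algorithm must provide.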
Weighted Min-Cut: Sequential, Cut-Query and Streaming Algorithms
Consider the following 2-respecting min-cut problem. Given a weighted graph
and its spanning tree , find the minimum cut among the cuts that contain
at most two edges in . This problem is an important subroutine in Karger's
celebrated randomized near-linear-time min-cut algorithm [STOC'96]. We present
a new approach for this problem which can be easily implemented in many
settings, leading to the following randomized min-cut algorithms for weighted
graphs.
* An -time sequential algorithm:
This improves Karger's and bounds when the input graph is not extremely
sparse or dense. Improvements over Karger's bounds were previously known only
under a rather strong assumption that the input graph is simple [Henzinger et
al. SODA'17; Ghaffari et al. SODA'20]. For unweighted graphs with parallel
edges, our bound can be improved to .
* An algorithm requiring cut queries to compute the min-cut of
a weighted graph: This answers an open problem by Rubinstein et al. ITCS'18,
who obtained a similar bound for simple graphs.
* A streaming algorithm that requires space and
passes to compute the min-cut: The only previous non-trivial exact min-cut
algorithm in this setting is the 2-pass -space algorithm on simple
graphs [Rubinstein et al., ITCS'18] (observed by Assadi et al. STOC'19).
In contrast to Karger's 2-respecting min-cut algorithm, which deploys
sophisticated dynamic programming techniques, our approach exploits some cute
structural properties, so that it only needs to compute the values of cuts
corresponding to removing pairs of tree edges, an operation that can be done
quickly in many settings.
Comment: Updates on this version: (1) Minor corrections in Sections 5.1 and 5.2;
(2) References to newer results by GMW SOSA'21 (arXiv:2008.02060v2), DEMN
STOC'21 (arXiv:2004.09129v2) and LMN'21 (arXiv:2102.06565v1).
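A brute-force sketch of the 2-respecting min-cut subroutine, assuming a small graph (the real algorithms avoid the exponential enumeration used here and compute the pairwise cut values efficiently):

```python
def cut_weight(edges, side):
    """Total weight of graph edges (u, v, w) crossing the vertex set `side`."""
    return sum(w for u, v, w in edges if (u in side) != (v in side))

def two_respecting_mincut(n, edges, tree_edges):
    """Minimum over all cuts crossed by at most two spanning-tree edges."""
    best = float('inf')
    for mask in range(1, 2 ** n - 1):          # every proper nonempty side
        side = {v for v in range(n) if mask >> v & 1}
        crossing = sum(1 for u, v in tree_edges if (u in side) != (v in side))
        if crossing <= 2:                      # the cut 2-respects the tree
            best = min(best, cut_weight(edges, side))
    return best

# Weighted triangle; spanning tree is the path 0-1-2.
edges = [(0, 1, 3), (1, 2, 1), (0, 2, 2)]
tree = [(0, 1), (1, 2)]
print(two_respecting_mincut(3, edges, tree))  # 3: isolating vertex 2 cuts 1 + 2
```

Karger's framework guarantees that, for a suitable sample of spanning trees, the global min-cut 2-respects at least one of them with high probability, so this subroutine suffices.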
Low-Carbohydrate High-Fat (LCHF) Diet: Evidence of Its Benefits
Current dietary recommendations state that there is insufficient evidence to prescribe an exact percentage of calories from carbohydrate, protein and fat for people with diabetes from the variety of popular diets currently available. Over the years, much research has focused on the right proportion of carbohydrate and fat in a balanced diabetic diet. The jury is still out on the relative merits and demerits of the two alternatives: a low-carbohydrate, high-fat diet versus a low-fat, high-carbohydrate diet. Evidence from various studies suggests that low-carbohydrate diets improve cardiovascular (CVD) risk by lowering HbA1c levels and improving blood pressure and body weight. There is also a positive effect on lipid profile and reversal of non-alcoholic fatty liver disease (NAFLD). Whilst there are some significant metabolic benefits of the LCHF diet, it is accepted that longer-term studies are needed before it can be used in daily clinical practice. This chapter focuses on the basic physiology and metabolism of carbohydrate and fat in normal and diabetic patients, and reviews the literature on these two diet combinations, with current thoughts and evidence on this core issue affecting insulin utilization and metabolic profile.
Lower Bounds for Elimination via Weak Regularity
We consider the problem of elimination in communication complexity, which was first raised by Ambainis et al. and later studied by Beimel et al. for its connection to the famous direct sum question. In this problem, let f: {0,1}^2n -> {0,1} be any Boolean function. Alice and Bob get k inputs x_1, ..., x_k and y_1, ..., y_k respectively, with x_i, y_i in {0,1}^n. They want to output a k-bit vector v such that there exists one index i for which v_i is not equal to f(x_i, y_i). We prove a general result lower bounding the randomized communication complexity of the elimination problem for f using its discrepancy. Consequently, we obtain strong lower bounds for the functions Inner-Product and Greater-Than, which work for exponentially larger values of k than the best previous bounds.
To prove our result, we use a pseudo-random notion called regularity that was first used by Raz and Wigderson. We show that functions with small discrepancy are regular. We also observe that a weaker notion, which we call weak-regularity, already implies hardness of elimination. Finally, we give a different proof, borrowing ideas from Viola, to show that Greater-Than is weakly regular.
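A small sketch of the elimination task itself, using Inner-Product as the inner function. Note that a zero-communication strategy (output a uniformly random vector) already succeeds with probability 1 - 2^{-k}, which is what makes lower bounds for this problem delicate:

```python
import random

def inner_product(x, y):
    """Inner product mod 2 on bit strings."""
    return sum(a & b for a, b in zip(x, y)) % 2

def is_valid_elimination(v, xs, ys, f):
    """v solves elimination iff it differs from the true answer vector somewhere."""
    return any(v[i] != f(xs[i], ys[i]) for i in range(len(v)))

# k = 4 instances; Alice holds xs, Bob holds ys.
xs = [[1, 0, 1], [0, 1, 1], [1, 1, 1], [0, 0, 1]]
ys = [[1, 1, 0], [0, 1, 0], [1, 0, 1], [1, 1, 1]]

# Guessing at random fails only if v exactly equals the true vector.
v = [random.randint(0, 1) for _ in range(len(xs))]
print(is_valid_elimination(v, xs, ys, f=inner_product))
```

The paper's contribution is showing that doing substantially better than this trivial success probability requires large communication when f has small discrepancy.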
Work-Optimal Parallel Minimum Cuts for Non-Sparse Graphs
We present the first work-optimal polylogarithmic-depth parallel algorithm
for the minimum cut problem on non-sparse graphs. For
for any constant , our algorithm requires work and
depth and succeeds with high probability. Its work matches the
best runtime for sequential algorithms [MN STOC 2020, GMW SOSA
2021]. This improves the previous best work by Geissmann and Gianinazzi [SPAA
2018] by factor, while matching the depth of their algorithm. To
do this, we design a work-efficient approximation algorithm and parallelize the
recent sequential algorithms [MN STOC 2020; GMW SOSA 2021] that exploit a
connection between 2-respecting minimum cuts and 2-dimensional orthogonal range
searching.
Comment: Updates on this version: Minor corrections for the previous and our
results.
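The range-searching primitive behind this connection can be stood in for by a brute-force version (the cited algorithms use efficient range-search data structures; the mapping of non-tree edges to 2-D points via Euler-tour indices of their endpoints is described in those papers):

```python
def range_sum(points, x_lo, x_hi, y_lo, y_hi):
    """Total weight of points (x, y, w) inside the box [x_lo, x_hi] x [y_lo, y_hi].

    In the 2-respecting min-cut setting, such a sum gives the total weight of
    non-tree edges joining two subtrees, i.e. the contribution to a cut value.
    """
    return sum(w for x, y, w in points
               if x_lo <= x <= x_hi and y_lo <= y <= y_hi)

pts = [(1, 5, 2.0), (3, 2, 1.5), (4, 4, 0.5)]
print(range_sum(pts, 1, 3, 2, 5))  # 3.5: picks (1, 5) and (3, 2)
```

Replacing this linear scan with a proper range-search structure is what makes the work-efficient parallelization possible.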
Nearly Optimal Communication and Query Complexity of Bipartite Matching
We settle the complexities of the maximum-cardinality bipartite matching
problem (BMM) up to poly-logarithmic factors in five models of computation: the
two-party communication, AND query, OR query, XOR query, and quantum edge query
models. Our results answer open problems that have been raised repeatedly since
at least three decades ago [Hajnal, Maass, and Turán STOC'88; Ivanyos, Klauck,
Lee, Santha, and de Wolf FSTTCS'12; Dobzinski, Nisan, and Oren STOC'14; Nisan
SODA'21] and tighten the lower bounds shown by Beniamini and Nisan [STOC'21]
and Zhang [ICALP'04]. We also settle the communication complexity of the
generalizations of BMM, such as maximum-cost bipartite -matching and
transshipment; and the query complexity of unique bipartite perfect matching
(answering an open question by Beniamini [2022]). Our algorithms and lower
bounds follow from simple applications of known techniques such as
cutting-plane methods and set disjointness.
Comment: Accepted in FOCS 202
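For context, here is a minimal augmenting-path algorithm for BMM (Kuhn's algorithm, a textbook baseline; it is not the communication- or query-efficient method of the paper):

```python
def max_bipartite_matching(adj, n_left, n_right):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm).

    adj[u] lists the right-side neighbors of left vertex u.
    """
    match_right = [-1] * n_right           # match_right[v] = left partner of v

    def try_augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            # v is free, or its current partner can be rematched elsewhere.
            if match_right[v] == -1 or try_augment(match_right[v], seen):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in range(n_left))

# 3x3 example with a perfect matching.
adj = [[0, 1], [0], [1, 2]]
print(max_bipartite_matching(adj, 3, 3))  # 3
```

This takes O(V * E) time with explicit edge access; the point of the results above is to pin down how little communication or how few oracle queries suffice when the edges are not given explicitly.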